Search for: All records

Creators/Authors contains: "Nearing, Grey S."

  1. Abstract

    Because of the possibility of getting the right answers for the wrong reasons, the predictive performance of a complex systems model is not by itself a reliable indicator of hypothesis quality for the purposes of scientific learning about processes. The predictive performance of a structurally adequate model should be an emergent property of its functional performance. In this context, any Pareto trade-off between measures of predictive performance and functional performance indicates process-level error in the model: if such a trade-off exists, the model's predictions are right for the wrong functional reasons. This paper demonstrates a novel concept, based on information theory, that is capable of attributing observed errors to specific processes. To demonstrate that the concept and method hold for models and observations of real systems, we perform a minimal single-parameter-variation sensitivity analysis with a sophisticated ecohydrology model, MLCan, at a well-monitored field site (the Bondville, IL AmeriFlux soybean site). We identify both functional and predictive error in MLCan, along with evidence of the hypothesized trade-off between the two, which indicates structural error within MLCan. For example, the sensible heat flux process can be calibrated to achieve good predictive performance at the cost of poor functional performance. In contrast, we find little structural error for processes driven by solar radiation, which appear “right for the right reasons.” This method could be applied broadly to pinpoint process-level and structural error in a wide range of system models beyond the ecohydrological scope demonstrated here. A minimal illustrative sketch of this kind of information-based error attribution appears after this list.

  2. We propose a conceptual and theoretical foundation for information-based model benchmarking and process diagnostics that provides insight into model performance and model realism. We benchmark against a bounded estimate of the information contained in model inputs to obtain a bounded estimate of the information lost due to model error, and we perform process-level diagnostics by taking differences between modeled and observed transfer entropy networks. We use this methodology to reanalyze the recent Protocol for the Analysis of Land Surface Models (PALS) Land Surface Model Benchmarking Evaluation Project (PLUMBER) intercomparison, which includes the following models: CABLE, CH-TESSEL, COLA-SSiB, ISBA-SURFEX, JULES, Mosaic, Noah, and ORCHIDEE. We report that these models (i) use only roughly half of the information available from meteorological inputs about observed surface energy fluxes, (ii) do not use all of the information from meteorological inputs about long-term Budyko-type water balances, (iii) do not capture spatial heterogeneities in surface processes, and (iv) all suffer from similar patterns of process-level structural error. Because the PLUMBER project did not report model parameter values, it is impossible to know whether these process-level error patterns are due to model structural error or to parameter error, although our proposed information-theoretic methodology could distinguish between the two if parameter values were reported. We conclude that there is room for significant improvement to the current generation of land models and their parameters. We also suggest two simple guidelines to make future community-wide model evaluation and intercomparison experiments more informative. A minimal sketch of the transfer-entropy network comparison appears after this list.

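The Pareto check described in the first record above can be illustrated with a short, self-contained sketch. The Python code below is not the MLCan analysis: the toy flux parameterization, the synthetic driver and flux series, the bin count, and the use of mutual information as both the predictive and the functional score are assumptions made for illustration. It sweeps a single parameter, scores each run for predictive performance (mutual information between modeled and observed flux) and functional performance (agreement between the modeled and observed driver-to-flux information coupling), and lists the Pareto-nondominated parameter values; a non-trivial front of this kind is what the abstract interprets as process-level structural error.

```python
"""Illustrative sketch only: a single-parameter sweep scored by a predictive
and a functional information metric, followed by a Pareto-front check.
All variable names, the toy model, and the estimator are assumptions."""
import numpy as np

def mutual_information(x, y, bins=11):
    # I(X;Y) in nats from a fixed-width 2-D histogram estimate of the joint density.
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(1)
n = 5000
sw = rng.normal(size=n)                      # stand-in radiation driver
ta = 0.6 * sw + 0.8 * rng.normal(size=n)     # stand-in correlated air temperature
h_obs = 0.7 * sw + 0.3 * rng.normal(size=n)  # "observed" flux responding to radiation

records = []
for a in np.linspace(0.0, 1.0, 21):          # single-parameter variation
    h_mod = a * sw + (1.0 - a) * ta          # toy flux parameterization
    predictive = mutual_information(h_mod, h_obs)
    functional = -abs(mutual_information(sw, h_mod) - mutual_information(sw, h_obs))
    records.append((a, predictive, functional))

# Pareto-nondominated parameter values (higher is better on both axes); a
# non-trivial front would indicate predictions that are right for the wrong
# functional reasons, i.e. process-level structural error.
front = [r for r in records
         if not any(o[1] >= r[1] and o[2] >= r[2] and o != r for o in records)]
for a, p, f in front:
    print(f"a={a:.2f}  predictive={p:.3f}  functional={f:.3f}")
```

In the paper's setting, both scores would be computed from MLCan output and flux-tower observations rather than from this toy model, and more careful information estimators would replace the fixed-width histograms used here.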
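The process-level diagnostic in the second record, differencing modeled and observed transfer entropy networks, can be sketched in the same style. The code below is not the estimator used in the paper; the variable names (SWdown, Qle), the lag, the bin count, and the synthetic series are illustrative assumptions. It computes the transfer entropy TE(X -> Y) = I(Y_t ; X_{t-1} | Y_{t-1}) for every directed pair of variables, once from "observations" and once from "model output", and reports the difference on each edge of the network.

```python
"""Illustrative sketch only: a binned transfer-entropy network difference
between synthetic "observed" and "modeled" series."""
import itertools
import numpy as np

def hist_entropy(*columns, bins=11):
    # Shannon entropy (nats) of the joint histogram of the given columns.
    counts, _ = np.histogramdd(np.column_stack(columns), bins=bins)
    p = counts.ravel() / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def transfer_entropy(x, y, bins=11, lag=1):
    # TE(X -> Y) = H(Y_t, Y_lag) + H(Y_lag, X_lag) - H(Y_lag) - H(Y_t, Y_lag, X_lag)
    yt, ylag, xlag = y[lag:], y[:-lag], x[:-lag]
    return (hist_entropy(yt, ylag, bins=bins)
            + hist_entropy(ylag, xlag, bins=bins)
            - hist_entropy(ylag, bins=bins)
            - hist_entropy(yt, ylag, xlag, bins=bins))

def te_network(series, bins=11, lag=1):
    # Directed transfer-entropy network over a dict of equally sampled series.
    return {(a, b): transfer_entropy(series[a], series[b], bins=bins, lag=lag)
            for a, b in itertools.permutations(series, 2)}

# Synthetic stand-ins for site observations and a land-model run at the same
# site: in the observations the flux responds strongly to lagged radiation,
# in the model the coupling is weaker, so the network difference flags that edge.
rng = np.random.default_rng(0)
n = 5000
sw = rng.normal(size=n)
obs = {"SWdown": sw, "Qle": 0.8 * np.roll(sw, 1) + 0.2 * rng.normal(size=n)}
mod = {"SWdown": sw, "Qle": 0.5 * np.roll(sw, 1) + 0.5 * rng.normal(size=n)}

te_obs, te_mod = te_network(obs), te_network(mod)
diff = {edge: te_mod[edge] - te_obs[edge] for edge in te_obs}
for edge, value in sorted(diff.items()):
    print(f"{edge[0]} -> {edge[1]}: modeled TE minus observed TE = {value:+.3f}")
```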